Fast computation with spikes in a recurrent neural network.
Authors
Abstract
Neural networks with recurrent connections are sometimes regarded as too slow at computation to serve as models of the brain. Here we analytically study a counterexample, a network consisting of N integrate-and-fire neurons with self-excitation, all-to-all inhibition, instantaneous synaptic coupling, and constant external driving inputs. When the inhibition and/or excitation are large enough, the network performs a winner-take-all computation for all possible external inputs and initial states of the network. The computation is done very quickly: as soon as the winner spikes once, the computation is complete, since no other neuron will spike. For some initial states, the winner is the first neuron to spike, and the computation is done at the first spike of the network. In general, there are M potential winners, corresponding to the top M external inputs. When the external inputs are close in magnitude, M tends to be larger. If M > 1, the selection of the actual winner is strongly influenced by the initial states. If a special relation between the excitation and inhibition is satisfied, the network always selects the neuron with the maximum external input as the winner.
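The abstract describes the model only verbally. Below is a minimal, time-stepped Python sketch of such a winner-take-all network, assuming non-leaky integrate-and-fire dynamics, a unit threshold, reset followed by instantaneous self-excitation, and membrane potentials clipped at zero. The function name `simulate_wta` and the parameters `alpha` (self-excitation), `beta` (inhibition), `theta` (threshold), and `dt` are illustrative choices, not taken from the paper.

```python
import numpy as np

def simulate_wta(I, alpha, beta, theta=1.0, t_max=50.0, dt=1e-3, seed=0):
    """Sketch of a winner-take-all spiking network: N non-leaky
    integrate-and-fire neurons driven by constant inputs I. When a neuron
    reaches threshold `theta`, it is reset and instantaneously re-excites
    itself by `alpha`, while every other neuron is inhibited by `beta`.
    Clipping potentials at zero is an assumption of this sketch."""
    rng = np.random.default_rng(seed)
    I = np.asarray(I, dtype=float)
    N = len(I)
    V = rng.uniform(0.0, theta, size=N)       # arbitrary initial state
    spikes = []                               # list of (time, neuron index)
    t = 0.0
    while t < t_max:
        V += I * dt                           # constant external drive
        for i in np.flatnonzero(V >= theta):  # neurons crossing threshold
            spikes.append((round(t, 4), int(i)))
            V[i] = alpha                      # reset + instantaneous self-excitation
            V[np.arange(N) != i] -= beta      # instantaneous all-to-all inhibition
        np.clip(V, 0.0, None, out=V)
        t += dt
    return spikes

if __name__ == "__main__":
    # With strong enough alpha and/or beta, only one neuron keeps spiking
    # after the network's first spike; which one wins depends on the
    # initial state as well as the inputs.
    spikes = simulate_wta(I=[0.90, 1.00, 0.95], alpha=0.6, beta=0.6)
    first = spikes[0][1]
    later = {i for _, i in spikes[1:]}
    print("first spiker:", first, "| neurons spiking afterwards:", later)
```

With these illustrative parameters, the inhibition delivered at each of the winner's spikes exceeds what the other neurons can integrate in between, so the printed set contains only the first spiker, consistent with the abstract's claim that for some initial states the computation finishes at the network's first spike.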
Similar resources
Spiking neural network for recognizing spatiotemporal sequences of spikes.
Sensory neurons in many brain areas spike with precise timing in response to temporally structured stimuli, encoding temporally complex stimuli into spatiotemporal spikes. How downstream neurons read out such a neural code is an important unsolved problem. In this paper, we describe a decoding scheme using a spiking recurrent neural network. The network consists of excitatory neurons that form a syn...
Gradient Descent for Spiking Neural Networks
Much of the work on neural computation is based on network models of static neurons that produce analog output, despite the fact that information processing in the brain is predominantly carried out by dynamic neurons that produce discrete pulses called spikes. Research in spike-based computation has been impeded by the lack of an efficient supervised learning algorithm for spiking networks. Here,...
Fast convergence of spike sequences to periodic patterns in recurrent networks.
Dynamical attractors are thought to underlie many biological functions of recurrent neural networks. Here we show that stable periodic spike sequences with precise timings are the attractors of the spiking dynamics of recurrent neural networks with global inhibition. Almost all spike sequences converge to these attractors within a finite number of transient spikes. The convergence is fast, ...
A New Recurrent Fuzzy Neural Network Controller Design for Speed and Exhaust Temperature of a Gas Turbine Power Plant
In this paper, a recurrent fuzzy neural network (RFNN) controller with a neural network identifier in a direct control model is designed to control the speed and exhaust temperature of the gas turbine in a combined cycle power plant. Since turbine operation in a combined cycle unit is considered, the speed and exhaust temperature of the gas turbine should be simultaneously controlled by the fuel command ...
Estimating State and Parameters in State Space Models of Spike Trains
Neural computations at all scales of evolutionary and behavioural complexity are carried out by recurrently connected networks of neurons that communicate with each other, with neurons elsewhere in the brain, and with muscles through the firing of action potentials or “spikes”. To understand how nervous tissue computes, it is therefore necessary to understand how the spiking of neurons is shape...
Journal: Physical Review E: Statistical, Nonlinear, and Soft Matter Physics
Volume: 65, Issue: 5 (Pt 1)
Pages: -
Publication year: 2002